EDA News
Monday March 15, 2004
From: EDACafe

TEST & ATE - Cost of Test
March 8-12, 2004
By Dr. Jack Horgan

While manufacturing cost per transistor has plummeted, test cost per transistor has remained relatively level. Among the reasons for this phenomenon: increasing gate counts require longer test sequences; a mix of analog, digital, and memory circuitry requires different test strategies; higher operating frequencies make at-speed test difficult; and an increasing number of functional levels requires more complex test sequences.

The cost of test (COT) is given, simplistically, by the formula

COT = (Capital + Operating costs) / (Yield x Utilization x Throughput)

Capital and operating costs include depreciation of the acquisition cost of the ATE machine, handlers, and probes; amortization of facility modifications; maintenance and spares; facilities; indirect materials and consumables; and labor and overhead. Equipment utilization is the percentage of time spent in production, excluding non-production uses such as engineering use, maintenance and repair, and idle time. Yield is the percentage of parts passing the test, while throughput is the number of parts tested per unit time. At a purchase price in excess of $1M, depreciation is usually the dominant factor. It is often said that the cheapest ATE machine is the one that is fully depreciated.

This cost model covers only the time a chip spends "in-socket" on the ATE. It does not include the costs of test program development, simulation, and debug. It does not include the costs of diagnosing and addressing the root causes of both detected and undetected defects and their impact (yield loss) during this time. Nor does it include the impact of "defect escapes".
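The simplified COT formula above is easy to express in code. A minimal sketch (the figures, variable names, and annualized units are illustrative assumptions, not from the article):

```python
def cost_of_test(capital, operating, yield_frac, utilization, throughput):
    """Per-part cost of test under the simplified COT model.

    capital, operating  -- annualized costs in dollars (assumed units)
    yield_frac, utilization -- fractions in (0, 1]
    throughput -- parts tested per year
    """
    return (capital + operating) / (yield_frac * utilization * throughput)

# Example: $1M annualized capital + $0.5M operating costs,
# 90% yield, 80% utilization, 10M parts tested per year.
cot = cost_of_test(1_000_000, 500_000, 0.9, 0.8, 10_000_000)
print(f"${cot:.3f} per part")  # -> $0.208 per part
```

Note how yield and utilization sit in the denominator: letting a fully depreciated machine sit idle, or testing a low-yielding lot, raises the effective cost of every good part.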
An undetected defect at one level of integration translates into a probability of faults at the next level of integration.

The other side of the cost coin is time: time to market (TTM), time to volume (TTV), and time to profit (TTP). To the extent that any test-related activity lies on the critical path, it affects these important metrics. In an era of increasingly short product lifetimes, any delay can reduce the size of the remaining market, a firm's share of the available market, and its margin.

Reducing price is a challenge for ATE vendors, given their relatively small volumes coupled with pressure to align their R&D efforts with rapidly evolving technologies. However, ATE vendors have successfully developed ways (multisite and concurrent testing) to increase throughput by introducing parallelism into what had been a serial process. Multisite capabilities enable two or more devices under test (DUTs) to be tested simultaneously; the number of sites is much greater for memory chips than for logic chips. With concurrent testing, the parallelism is inside the device itself. An SoC contains several embedded cores (predesigned and verified but untested blocks - logic, memory, analog) requiring different test strategies and relying on core-vendor-supplied tests. Concurrent testing enables multiple cores to be tested simultaneously. The greatest degree of parallelism comes from combining the two approaches.

Parallelism offers the greatest cost savings per device. Doubling the number of devices tested simultaneously simplistically halves the cost, while a 10% to 15% reduction in ATE machine purchase price might result in only a 5% COT saving.

Some have suggested that a new standard might be part of the solution. In fact, there already are IEEE standards in the test arena, including JTAG boundary scan, STIL (Standard Test Interface Language), and CTL (Core Test description Language).
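To see why parallelism beats price cuts in this model, plug illustrative numbers into the COT formula. The split between capital and operating expense below is an assumption chosen so that capital is one third of total cost, which reproduces the article's rough ratio:

```python
def cot(capital, operating, yld, util, parts_per_year):
    # Per-part cost: (capital + operating) / (yield * utilization * throughput)
    return (capital + operating) / (yld * util * parts_per_year)

base = cot(1_000_000, 2_000_000, 0.9, 0.8, 10_000_000)

# Dual-site testing doubles throughput (ignoring handler overhead): COT halves.
dual_site = cot(1_000_000, 2_000_000, 0.9, 0.8, 20_000_000)

# A 15% cheaper tester only trims the capital term; with capital at one third
# of total cost, the COT saving is 0.15 / 3 = 5%.
cheaper = cot(0.85 * 1_000_000, 2_000_000, 0.9, 0.8, 10_000_000)

print(f"dual-site saving: {1 - dual_site / base:.0%}")        # 50%
print(f"cheaper-tester saving: {1 - cheaper / base:.0%}")     # 5%
```

The asymmetry is structural: parallelism scales the whole denominator, while a purchase-price discount only shrinks one term of the numerator.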
In July 2002 at Semicon West, Advantest Corporation, with support from Intel, created a stir by announcing its plans to establish the Semiconductor Test Consortium (STC), a non-profit, industry-wide collaboration to develop and proliferate a Semiconductor Test Open Architecture. By July 2003 STC had released the first two drafts of the specification for OPENSTAR, the Open Semiconductor Test Architecture. Twenty-three members are listed on the website www.semitest.org, including Intel, Motorola, Fujitsu and Philips Semiconductor. Other than founding member Advantest, no other ATE vendor is a member.

In his keynote address at the International Test Conference in October 2002, Alex d'Arbeloff, founder of Teradyne, commented: "Basically there is a fundamental switch in the test industry - from one that relies on technology innovation where products are bought based on performance, to one where commodity products are bought on lowest price." He opposes an industry-wide open architecture infrastructure that would allow various modules to be mixed and matched in any test vendor's standardized rack. "That approach won't work in the ATE industry, where the value that is offered to test equipment users is the system integration done by ATE vendors, which otherwise users would have to pay," he said.

In a phone conversation, Sergio Perez, Vice Chairman of STC, acknowledged that it would be difficult for ATE vendors to "jump on board" at this time. It would take 18 to 24 months to develop a compliant product, and such a move would undermine their current product offerings. He sees many smaller module and instrumentation vendors benefiting from this open standard. Wayne Lonowski, EDA/DFT Marketing Manager at Agilent, sees the greatest benefit for end users in standardized interfaces that make the information flow seamless; he considers the benefits of a standardized architecture debatable.
In the meantime, the ATE industry, both vendors and customers, continues to gravitate toward single-platform systems that have an open architecture in the sense that they are open to third-party instrument suppliers, even as the architecture itself remains proprietary.

The traditional approach to testing is functional, or behavioral, testing. Functional testing allows a very large number of actual functional paths to be exercised at speed, using millions of vectors in a few milliseconds. The problem is that the number of input/output combinations is rising exponentially, and with it testing time. Sophisticated algorithms, e.g. redundancy removal, have been developed to reduce the number of test vectors.

A different approach is structural testing, which seeks to determine whether or not a particular physical failure has occurred. Fault models define the properties of the tests that will detect the faulty behavior caused by defects. The most common fault models are the single stuck-at DC model, the transition and path-delay AC models, and current-measurement models. Once a set of faults is modeled, tests can be generated to differentiate a circuit containing a fault from a fault-free circuit.

A popular approach to the ever-increasing cost-of-test problem is to apply Design-for-Test (DFT) techniques to the device during the design phase. These techniques are designed to detect specific types of faults in the IC, and they all require the addition of circuitry and/or adherence to particular design rules during the design process. DFT techniques provide a high degree of access, controllability, and observability to the internals of the design with a reduced test pin count. DFT-related test techniques include internal scan, boundary scan, IDDQ and BIST. Scan is a technique for converting sequential design elements (i.e. flip-flops and latches) into control and observation points.
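A toy illustration of the structural idea: model a two-gate circuit, inject a single stuck-at-0 fault, and search for an input vector on which the faulty circuit differs from the good one. The circuit and fault are invented for illustration, and exhaustive search stands in for a real ATPG algorithm:

```python
from itertools import product

def good(a, b, c):
    # fault-free circuit: y = (a AND b) OR c
    return (a & b) | c

def faulty(a, b, c):
    # same circuit with the AND gate's output stuck-at-0
    return 0 | c

# A test vector for this fault is any input on which the two circuits disagree.
tests = [v for v in product((0, 1), repeat=3) if good(*v) != faulty(*v)]
print(tests)  # [(1, 1, 0)]
```

Only (a=1, b=1, c=0) propagates the fault to the output, which is why targeted structural patterns can be so much shorter than exhaustive functional ones.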
A scan chain is a set of sequential elements connected into a shift register, which provides additional observe and control points at internal nodes of a design. Boundary scan inserts a ring of observe and control cells around a chip's functional pins and control ports. IDDQ testing detects faults by measuring the very small power-supply current drawn when the device has been put into a quiescent state; there is an order-of-magnitude difference between defect-free CMOS circuits and those containing a silicon defect.

BIST (Built-In Self-Test) is circuitry added to an IC design to verify the structural integrity of that design using on-chip stimulus and on-chip response analysis. The on-chip circuitry generates a sufficient number of pseudorandom test patterns (via a pseudorandom pattern generator, or PRPG) to provide a high probability that the total pattern set will exercise each potential defect and yield a telltale output signature indicating whether the circuitry is good or bad. All that is required from external ATE is a signal to initiate the test and the ability to respond to BIST pass/fail results. There is some debate as to whether BIST provides the same coverage as scan/ATPG.

The trend is toward increasing test data volumes, which increases test application times. Most major test tool vendors offer deterministic compression schemes that work in conjunction with ATPG algorithms to significantly reduce scan test data volume, scan test time, and vector-buffer-memory requirements (avoiding costly reloads). On-chip circuitry decompresses the full vector set for test execution and compresses test results for transmission back to the ATE for pass/fail determination and fault diagnostics. Another parallelism technique, originally introduced by Syntest, is multiple-capture-per-cycle, which reduces the total test time of designs with multiple-frequency clock domains by deploying only one scan-in/scan-out phase for each pattern.
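The PRPG at the heart of logic BIST is typically a linear-feedback shift register (LFSR). A minimal sketch (the 4-bit width and tap positions are illustrative; real PRPGs are much wider):

```python
def lfsr_patterns(seed, taps, width, count):
    """Generate pseudorandom patterns from a Fibonacci LFSR.

    seed  -- nonzero initial state
    taps  -- bit positions XORed together to form the feedback bit
    width -- register width in bits
    """
    state, mask = seed, (1 << width) - 1
    for _ in range(count):
        yield state
        feedback = 0
        for t in taps:
            feedback ^= (state >> t) & 1
        state = ((state << 1) | feedback) & mask

# A maximal-length 4-bit LFSR (taps at bits 3 and 2) cycles through
# all 15 nonzero states before repeating.
patterns = list(lfsr_patterns(0b1000, taps=(3, 2), width=4, count=15))
```

A related structure, the multiple-input signature register (MISR), compacts the circuit's responses to these patterns into the telltale pass/fail signature described above.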
There are costs to using DFT techniques: overhead in chip area, yield reduction, and performance impact, as well as the cost of acquiring DFT tools and learning how to employ them effectively. The benefits are generally conceded to outweigh these costs. DFT techniques and hardware parallelism can be leveraged to perform more tests in a given time period (increased quality), to test more devices in a given time period (greater throughput), or to optimize the cost/quality tradeoff. Less expensive machines requiring less memory can also be used.

There has been much discussion about low-cost, DFT-oriented testers. A few VC-funded start-up companies (Teseda and Inovys) occupy this niche market. Andrew Levy, Director of Marketing for Teseda, says their product is targeted at the engineer. They offer a desktop unit (~$60K) as a productivity tool for DFT validation, debugging, and failure analysis, an alternative to using heavily scheduled production ATE machines. The system is closely coupled bi-directionally with EDA tools via STIL. Inovys offers both a desktop test system for engineering teams and a production system. CEO Paul Sakamoto says its value proposition is not a lower-cost tester but productivity - not just of the engineer or tester but of the entire operation, in terms of time-to-market and shorter yield-learning time. Synopsys is an investor in both firms. Agilent has also introduced a lower-cost DFT tester. None of these systems has sold in significant volumes to date.

If a firm were to purchase a single ATE machine, that machine would have to cover all of its testing requirements. A DFT tester alone would not suffice for mixed-signal devices, since the analog component is the source of most defects. The top 10 IDMs and top 10 assembly/test contractors probably account for 80% of ATE installations. These firms have large numbers of machines that remain in use for years.
Since a device is tested several times in the course of its manufacture, one can conceive of a scenario in which a specific test is DFT-only. The question then becomes whether it is more cost-efficient to use the same class of machine for all test scenarios or to have special-purpose testers. Another alternative is to use older ATE machines still in inventory or to purchase refurbished equipment. Paul Sakamoto questions whether legacy machines have the proper architecture to do all that DFT-oriented testers can do, or at least to do it as efficiently. He uses the analogy of an F-16 and a helicopter: both are flying machines, but only one can take off and land vertically.

While ATE machines can be used as simple pass/fail devices, they are capable of collecting, formatting, and communicating a great deal of valuable test data, particularly for failing chips from both manufacturing and the field. Data can be fed back to ATPG tools, where failure diagnostics can identify the gates likely to have caused a failure and, from there, the location in the physical layout. As geometries shrink, systematic (as opposed to random) defects are increasingly likely to involve design issues.

The International Technology Roadmap for Semiconductors (ITRS) 2003, in its section on Test and Test Equipment, comments: "Looking forward, test development time and cost will be reduced further by DFT techniques, test standards (i.e., to support test content reuse, test program inter-operability and manufacturing agility), automatic generation of test patterns (i.e., structural test approaches), and consideration of testability issues earlier in the design process. Structural test is becoming an industry wide practice, but will not replace functional test in most product segments in the near-term. DFT is mainstream in high-end digital logic designs and penetration into analog and SOC designs will commence in the near term.
DFT techniques will be used to increase throughput and/or utilization of tester resources, like digital test data compression techniques, bandwidth matching, and DFT that enables the testing of multiple cores concurrently (e.g., ADC, DAC, digital, and memory cores). ATE will have capabilities supporting EDA/DFT features, like for example capabilities to test multiple cores concurrently for a given site and straightforward communication between EDA and ATE environments (e.g., for data logging). The reduction in test time will partly be used to apply new deep sub-micron fault models as may be required to keep up test quality levels as technology progresses. DFT techniques will enable the use of lower cost lower capability equipment and reuse of existing equipment. For certain performance points and segments, dedicated low cost equipment is economically justified and will continue to be architected."

As the very name Design for Test implies, these techniques must be introduced early in the design phase. There are even tools to analyze the testability of a design. The leading EDA vendors offer complete suites of DFT tools fully integrated into their design flows (Cadence Encounter Test Solutions, Mentor Design-for-Test, and Synopsys Galaxy Test Automation). There are also two much smaller firms (<$10M revenue) that concentrate on test products and services (LogicVision's Embedded Test Solutions and Syntest Solutions).

Appendix: ATE Vendors

The semiconductor industry is highly concentrated, and a small number of integrated circuit device manufacturers and assembly and test subcontractors account for a substantial portion of the purchases of integrated circuit test equipment. The downturn in the semiconductor industry affected the test equipment market more significantly than the overall capital equipment sector.
The impact of this slowdown was magnified by the high proportion of fixed costs in the industry, including significant research and development, manufacturing, and sales costs. The main driver of customers' business has visibly shifted from communication, IT, networking, and internet infrastructure to consumers, who are driving the market for PCs, cell phones, electronic games, and similar consumer electronics. The major players in the ATE industry are, in alphabetical order, Advantest Corporation, Agilent, Credence+NPTest, LTX and Teradyne, Inc. On February 23rd Credence announced an agreement to acquire NPTest.

Table 1: ATE Vendor Revenues ($M unless noted; Advantest revenues are for fiscal years ending March 31)

Rev                     2000      2003    '03 vs '00
Teradyne               3,043     1,352      -55.6%
Credence                 757       182      -76.0%
NPTest                   286       231      -19.2%
Credence+NPTest        1,043       413      -60.4%
Agilent                9,361     6,056      -35.3%
Agilent ATE            1,030       885      -14.1%
LTX                      381       110      -71.1%
Advantest (¥M)       262,000   170,000      -35.1%
Advantest ATE (¥M)   227,000   158,000      -30.4%
Advantest ATE ($M)                 830

Most ATE vendors had significant net income in 2000 but have lost substantial amounts in the years since. The industry is more upbeat about recent performance and the future.

A New Methodology for Interconnect Design

NEW CADENCE ALLEGRO PLATFORM DELIVERS ON-TARGET, ON-TIME, HIGH-SPEED SYSTEM INTERCONNECT DESIGN

On March 8th Cadence announced the new Cadence Allegro System Interconnect Design Platform to optimize and accelerate high-performance, high-density interconnect design. This is the fourth platform introduced in the last 18 months. At the heart of this methodology is a virtual system interconnect (VSIC) model defined by Cadence that describes the physical, logical, and electrical properties of the system interconnect. The VSIC model is used to capture the original design intent and is matured throughout the design process as various segments of the interconnect are implemented.
This new methodology enables engineers to co-design an IC, its package, and the PCB simultaneously. Designers will be able to collaborate across design domains, make system-level tradeoffs, and update the VSIC model so that all designers in the flow work with the same data, which can dramatically reduce iterations and help ensure first-time design success. Many companies will be able to remove weeks or months from the design process through the combination of best practices and deployment of the co-design methodology and associated technology platform.

Weekly Industry News Highlights

Leading EDA Vendors Get on Board With MAX II Devices
Verisity Grows Lead in the Functional Verification Automation Market to More Than 2X
Mentor Graphics Expands Higher Education Program in China to Help Cultivate IC Design Growth
Celoxica Introduces High-Performance Evaluation Boards; RC300 and RC2000Pro Offer High Performance and Flexibility for Designers of Image Processing and Data Streaming Applications
Altera's New MAX II CPLD Family Delivers Dramatic Reduction in Cost and Power Consumption

Industry

SEMI Reports 2003 Global Semiconductor Equipment Sales of $22.2 Billion
IDC Forecasts 18% PC Semiconductor Revenue Growth in 2004

Copyright © 2004, Internet Business Systems, Inc.